EU proposes rules making it easier to sue drone makers, AI systems

#artificialintelligence

BRUSSELS, Sept 28 (Reuters) - The European Commission on Wednesday proposed rules making it easier for individuals and companies to sue makers of drones, robots and other products equipped with artificial intelligence software for compensation for harm caused by them.

The AI Liability Directive aims to address the increasing use of AI-enabled products and services and the patchwork of national rules across the 27-country European Union. Under the draft rules, victims can seek compensation for harm to their life, property, health and privacy due to the fault or omission of a provider, developer or user of AI technology, or for discrimination in a recruitment process using AI.

"We want the same level of protection for victims of damage caused by AI as for victims of old technologies," Justice Commissioner Didier Reynders told a news conference.

The rules lighten the burden of proof on victims with a "presumption of causality", which means victims only need to show that a manufacturer or user's failure to comply with certain requirements caused the harm and then link this to the AI technology in their lawsuit.


EU Draft Rules Would Make It Easier to Sue Drone Makers, AI Systems

#artificialintelligence

Individuals and companies that suffer harm from drones, robots and other products or services equipped with artificial intelligence software will find it easier to sue for compensation under EU draft rules seen by Reuters.

The AI Liability Directive, which the European Commission will announce on Wednesday, aims to address the increasing proliferation of AI-enabled products and services and the patchwork of national rules across the 27-country European Union. Victims can sue for compensation for harm to their life, property, health and privacy due to the fault or omission of a provider, developer or user of AI technology, or for discrimination in a recruitment process using AI, the draft rules said.

The rules seek to lighten the burden of proof on victims by introducing a "presumption of causality," which means victims only need to show that a manufacturer or user's failure to comply with certain requirements caused the harm and then link this to the AI technology in their lawsuit. Under a "right of access to evidence," victims can ask a court to order companies and suppliers to provide information about high-risk AI systems so that they can identify the liable person and find out what went wrong.